Asynchronous Multiagent Primal-Dual Optimization
Authors
Abstract
Similar Articles
Asynchronous parallel primal-dual block update methods
Recent years have witnessed a surge of asynchronous (async-) parallel computing methods, driven by the extremely large datasets involved in many modern applications and by the advancement of multi-core machines and computer clusters. In optimization, most work on async-parallel methods concerns unconstrained problems or those with block-separable constraints. In this paper, we propose an as...
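For intuition only, the Python sketch below is a generic toy illustration of the asynchronous primal-dual idea, not the algorithm proposed in the paper above; the problem, step size, and threading scheme are all assumptions chosen for simplicity. It minimizes sum_i 0.5*||x_i - c_i||^2 subject to the coupling constraint sum_i x_i = b: worker threads repeatedly re-minimize their own primal block against a possibly stale multiplier, while a separate thread runs dual ascent on the constraint residual.

import threading
import time
import numpy as np

# Assumed toy problem: minimize sum_i 0.5*||x_i - c_i||^2  s.t.  sum_i x_i = b.
np.random.seed(0)
n_blocks, dim = 4, 3
c = [np.random.randn(dim) for _ in range(n_blocks)]
b = np.ones(dim)

x = [np.zeros(dim) for _ in range(n_blocks)]   # primal blocks
lam = np.zeros(dim)                            # multiplier for sum_i x_i = b
lock = threading.Lock()
stop = threading.Event()

def primal_worker(i):
    # Each block re-minimizes the Lagrangian with a possibly stale multiplier;
    # the unlocked read is deliberate and mimics delayed information.
    while not stop.is_set():
        lam_snapshot = lam.copy()
        x[i] = c[i] - lam_snapshot             # closed-form block minimizer
        time.sleep(0.001 * np.random.rand())   # uneven compute speeds

def dual_worker(step=0.2):
    # Dual ascent on the coupling constraint, using whatever blocks are current.
    while not stop.is_set():
        with lock:
            lam[:] = lam + step * (sum(x) - b)
        time.sleep(0.002)

threads = [threading.Thread(target=primal_worker, args=(i,)) for i in range(n_blocks)]
threads.append(threading.Thread(target=dual_worker))
for t in threads:
    t.start()
time.sleep(1.0)
stop.set()
for t in threads:
    t.join()

# At the fixed point, x_i = c_i - lam and lam = (sum_i c_i - b) / n_blocks.
print("constraint violation:", np.linalg.norm(sum(x) - b))
print("multiplier error:", np.linalg.norm(lam - (sum(c) - b) / n_blocks))

Despite the stale multiplier reads, the dual iteration here is a contraction for this toy problem, so both printed quantities should be small after the run.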
Primal-Dual Optimization for Fluids
We apply a novel optimization scheme from the image processing and machine learning areas, a fast Primal-Dual method, to achieve controllable and realistic fluid simulations. While our method is generally applicable to many problems in fluid simulations, we focus on the two topics of fluid guiding and separating solid-wall boundary conditions. Each problem is posed as an optimization problem an...
Proximal Primal-Dual Optimization Methods
In the field of inverse problems, one of the main benefits of primal-dual optimization approaches is that they do not require any linear operator inversion. In addition, they make it possible to split a convex objective function into a sum of simpler terms, each handled individually either through its proximity operator or through its gradient if it corresponds to a smooth ...
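As a hedged illustration of that splitting principle, the sketch below runs a generic Chambolle-Pock-style primal-dual iteration on an assumed toy problem; it is not one of the specific methods discussed in the paper above. It solves min_x 0.5*||x - y||^2 + lam*||Dx||_1 using only matrix-vector products with D and closed-form proximity operators, so no linear operator is ever inverted.

import numpy as np

# Assumed toy problem: denoise a 1-D signal y with a total-variation penalty,
#     min_x 0.5*||x - y||^2 + lam*||D x||_1,
# where D is the forward finite-difference operator.
np.random.seed(1)
n, lam = 50, 0.5
y = np.cumsum(np.random.randn(n))        # noisy piecewise-smooth signal
D = np.diff(np.eye(n), axis=0)           # (n-1) x n difference operator

L = np.linalg.norm(D, 2)                 # operator norm of D
tau = sigma = 0.95 / L                   # step sizes with tau * sigma * L**2 < 1

x = y.copy()
x_bar = x.copy()
p = np.zeros(n - 1)                      # dual variable associated with D x

for _ in range(500):
    # Dual step: the prox of (lam*||.||_1)* is projection onto the l-infinity ball.
    p = np.clip(p + sigma * (D @ x_bar), -lam, lam)
    # Primal step: the prox of 0.5*||. - y||^2 has a closed form.
    x_new = (x - tau * (D.T @ p) + tau * y) / (1.0 + tau)
    # Extrapolation step.
    x_bar = 2.0 * x_new - x
    x = x_new

print("objective:", 0.5 * np.sum((x - y) ** 2) + lam * np.sum(np.abs(D @ x)))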
Primal-Dual Asynchronous Particle Swarm Optimization (pdAPSO) Algorithm For Self-Organized Flocking of Swarm Robots
This paper proposes a hybrid PSO algorithm that combines the Primal-Dual method with the APSO algorithm to address the problem of flocking motion in swarm robotics. The algorithm combines the explorative ability of APSO with the exploitative capacity of the Primal-Dual Interior Point Method. We hypothesize that the fusion of the two algorithms (APSO and Primal-Dual) offers a robust prospect of preven...
DSCOVR: Randomized Primal-Dual Block Coordinate Algorithms for Asynchronous Distributed Optimization
Machine learning with big data often involves large optimization models. For distributed optimization over a cluster of machines, frequent communication and synchronization of all model parameters (optimization variables) can be very costly. A promising solution is to use parameter servers to store different subsets of the model parameters, and update them asynchronously at different machines usi...
Journal
Journal title: IEEE Transactions on Automatic Control
Year: 2017
ISSN: 0018-9286, 1558-2523
DOI: 10.1109/tac.2017.2662019